12 research outputs found

    Rendering along the Hilbert Curve

    Full text link
    Based on the seminal work on Array-RQMC methods and rank-1 lattice sequences by Pierre L'Ecuyer and collaborators, we introduce efficient deterministic algorithms for image synthesis. Enumerating a low discrepancy sequence along the Hilbert curve superimposed on the raster of pixels of an image, we achieve noise characteristics that are desirable with respect to the human visual system, especially at very low sampling rates. Compared to the state of the art, our simple algorithms require neither randomization, nor costly optimization, nor lookup tables. We analyze correlations of space-filling curves and low discrepancy sequences, and demonstrate the benefits of the new algorithms in a professional, massively parallel light transport simulation and rendering system.
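
    A minimal sketch of the enumeration idea, assuming a power-of-two resolution n and the classic bit-manipulation mapping from pixel coordinates to a Hilbert index; the function names and the sample-assignment policy are illustrative, not taken from the paper:

        #include <cstdint>

        // Rotate/flip a quadrant (standard Hilbert curve transform).
        static void rot(uint32_t n, uint32_t& x, uint32_t& y, uint32_t rx, uint32_t ry) {
            if (ry == 0) {
                if (rx == 1) { x = n - 1 - x; y = n - 1 - y; }
                uint32_t t = x; x = y; y = t; // swap x and y
            }
        }

        // Map pixel (x, y) on an n x n raster (n a power of two) to its
        // index along the Hilbert curve.
        uint64_t xy2d(uint32_t n, uint32_t x, uint32_t y) {
            uint64_t d = 0;
            for (uint32_t s = n / 2; s > 0; s /= 2) {
                uint32_t rx = (x & s) > 0 ? 1u : 0u;
                uint32_t ry = (y & s) > 0 ? 1u : 0u;
                d += (uint64_t)s * s * ((3 * rx) ^ ry);
                rot(n, x, y, rx, ry);
            }
            return d;
        }

        // One plausible assignment (an assumption, not the paper's exact
        // scheme): the pixel with Hilbert index d consumes samples
        // d*spp, ..., d*spp + spp - 1 of a single low discrepancy
        // sequence, so consecutive sequence indices land in pixels that
        // are close on screen.
        uint64_t sampleIndex(uint32_t n, uint32_t x, uint32_t y, uint32_t spp, uint32_t s) {
            return xy2d(n, x, y) * spp + s;
        }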

    Quasi-Monte Carlo Algorithms (not only) for Graphics Software

    Full text link
    Quasi-Monte Carlo methods have become the industry standard in computer graphics. For that purpose, efficient algorithms for low discrepancy sequences are discussed. In addition, numerical pitfalls encountered in practice are revealed. We then take a look at massively parallel quasi-Monte Carlo integro-approximation for image synthesis by light transport simulation. Beyond superior uniformity, low discrepancy points may be optimized with respect to additional criteria, such as noise characteristics at low sampling rates or the quality of low-dimensional projections.
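
    One example of the kind of numerical pitfall alluded to above, offered as a common illustration rather than one singled out by the paper: converting a 32-bit integer sample to a single-precision float in [0,1).

        #include <cstdint>

        // Naive conversion: for i near 2^32, the product rounds up to
        // exactly 1.0f in single precision, violating the [0,1) contract
        // that downstream sampling code relies on.
        float toUnitFloatNaive(uint32_t i) {
            return i * (1.0f / 4294967296.0f); // may return 1.0f!
        }

        // Safe conversion: keep the 24 most significant bits, which a
        // float mantissa represents exactly; the result is at most
        // 1 - 2^-24 and therefore strictly less than 1.0f.
        float toUnitFloat(uint32_t i) {
            return (i >> 8) * (1.0f / 16777216.0f); // 2^-24
        }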

    Intelligent Assistants

    Get PDF
    Intelligent assistants are an increasingly commonplace class of information systems spanning a broad range of form and complexity. But what characterizes an intelligent assistant, and how do we design better assistants? In this paper, the authors contribute to scientific research on intelligent assistants in three steps, each building on the previous. First, they investigate the historical context of assistance as human work. By examining qualitative studies of the work of human assistants, the authors inductively derive concepts crucial to modeling the context of assistance. This analysis informs the second step, in which they develop a conceptual typology of intelligent assistants from 111 published articles. This typology explicates the characteristics (what or how) of intelligent assistants and their use context (who or which). In the third and final step, the authors utilize this typology to shed light on historical trends and patterns in the design and evaluation of intelligent assistants, reflect on missed opportunities, and discuss avenues for further exploration.

    Correction to: Intelligent Assistants

    Get PDF

    The Iray Light Transport Simulation and Rendering System

    Full text link
    While ray tracing has become increasingly common and path tracing is well understood by now, a major challenge lies in crafting an easy-to-use and efficient system implementing these technologies. Following a purely physically-based paradigm while still allowing for artistic workflows, the Iray light transport simulation and rendering system renders complex scenes at the push of a button and thus makes accurate light transport simulation widely available. In this document we discuss the challenges and implementation choices that follow from our primary design decisions, demonstrating that such a rendering system can be made a practical, scalable, and efficient real-world application that has been adopted by various companies across many fields and is in use by many industry professionals today.

    Optimal topological simplification of discrete functions on surfaces

    Get PDF
    We solve the problem of minimizing the number of critical points among all functions on a surface within a prescribed distance δ from a given input function. The result is achieved by establishing a connection between discrete Morse theory and persistent homology. Our method completely removes homological noise with persistence less than 2δ, constructively proving the tightness of a lower bound on the number of critical points given by the stability theorem of persistent homology in dimension two for any input function. We also show that an optimal solution can be computed in linear time after persistence pairs have been computed.
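
    In symbols, the claimed tightness can be paraphrased roughly as follows (my notation, not the paper's; the paper should be consulted for the exact hypotheses):

        % f: the input function on a surface, \delta: the prescribed tolerance.
        % The constructed function g satisfies
        \| f - g \|_\infty \le \delta ,
        % and its number of critical points meets the lower bound from the
        % stability theorem: only persistence pairs (b, d) of f with
        d - b > 2\delta
        % survive, i.e., all homological noise of persistence at most
        % 2\delta is removed.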

    Quasi-Monte Carlo light transport simulation by efficient ray tracing

    No full text
    Photorealistic image synthesis can be described by a path integral. This integral is numerically approximated by summing up contributions of transport paths that connect light sources and sensors, such as a camera or the eye. The paths are trajectories of Markov processes, whose edges are straight lines along rays of light and whose vertices are light scattering events. The goal of this thesis was to accelerate the simulation of light transport, to find new algorithms and data structures to efficiently trace rays, and to better approximate the distribution of light by simultaneously simulating an ensemble of paths instead of single trajectories, using quasi-Monte Carlo methods. We first present new data structures and heuristics that feature a smaller memory footprint at improved numerical precision. In addition, it is possible to ray trace even massive scenes in a strictly limited, a priori fixed memory block, using rapid construction techniques that allow the complete data structure to be rebuilt at interactive frame rates. All efforts were combined in a unified framework that further allows one to build the acceleration hierarchy using an on-demand policy and optionally balance construction time against ray intersection time. Besides finding faster ray tracing algorithms, the total number of rays to be shot was reduced by mathematical means: by simplifying complicated mathematical schemes in a non-obvious way, the time complexity of the quasi-Monte Carlo simulation process was reduced. Exploiting the fact that the underlying Fredholm integral equation has low-dimensional structure when not solved via the Neumann series, the resulting algorithms are simpler and much more efficient. The combination of these new techniques allows photorealistic image synthesis in almost real time. The results are demonstrated by several academic and industrial applications.
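
    For orientation, the underlying equation in its standard form from the rendering literature (not quoted verbatim from the thesis): the radiance L satisfies a Fredholm integral equation of the second kind, whose Neumann series expansion is what path tracing samples term by term, one term per path length.

        L = L_e + T_f L, \qquad
        (T_f L)(x, \omega) = \int_{\mathcal{S}^2}
            f_s(\omega_i, x, \omega)\, L\big(h(x, \omega_i), -\omega_i\big)\,
            |\cos \theta_i| \, d\omega_i ,

        % Neumann series: the k-th term accounts for light scattered k times.
        L = \sum_{k=0}^{\infty} T_f^k L_e .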

    Instant Ray Tracing: The Bounding Interval Hierarchy

    No full text
    We introduce a new ray tracing algorithm that exploits the best of previous methods: as with bounding volume hierarchies, the memory footprint of the acceleration data structure is linear in the number of objects to be ray traced and can be predicted prior to construction, while traversal of the hierarchy is as efficient as that of kd-trees. The construction algorithm can be considered a variant of quicksort and, for the first time, is based on a global space partitioning heuristic, which is much cheaper to evaluate than the classic surface area heuristic. Compared to spatial partitioning schemes, only a fraction of the memory is used and higher numerical precision is intrinsic. The new method is simple to implement, and its high performance is demonstrated by extensive measurements including massive as well as dynamic scenes, where we focus on the total time to image, including construction cost, rather than on frames per second alone.
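
    A sketch of the data structure and of the quicksort-like partition step; the node layout and names are illustrative assumptions, and the paper's actual encoding may differ:

        #include <algorithm>
        #include <cstdint>

        // Inner node: unlike a kd-tree node with one split plane, a
        // bounding interval hierarchy node stores two clip planes on the
        // split axis -- the maximum of the left child's objects and the
        // minimum of the right child's -- so the children may overlap or
        // leave an empty gap between them.
        struct BIHNode {
            float clip[2];      // clip[0]: left child's upper bound,
                                // clip[1]: right child's lower bound
            uint32_t axisFlags; // 2 bits: split axis or leaf marker;
                                // remaining bits: child/leaf offset
        };

        // Quicksort-style in-place partition of object centers around a
        // candidate plane taken from a global, object-independent
        // subdivision of space (e.g. the spatial median), rather than
        // from the surface area heuristic.
        int partitionObjects(float* center, int begin, int end, float plane) {
            int mid = begin;
            for (int i = begin; i < end; ++i)
                if (center[i] < plane)
                    std::swap(center[i], center[mid++]);
            return mid; // [begin, mid) goes left, [mid, end) goes right
        }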

    Joint recommendations of the project group Biostatistical DNA Calculations and the Stain Commission on the Biostatistical Evaluation of Forensic DNA Analytical Findings with Fully Continuous Models (FCM)

    Get PDF
    Biostatistical evaluation of DNA profiles supports the courts in assessing the evidentiary value of a DNA stain. To ensure the comparability of such calculations on the basis of established scientific standards, a set of recommendations has already been brought to national consensus in the past. With the introduction of fully continuous models (FCMs) for probabilistic genotyping, which among other things take account of an electropherogram's signal intensities, these recommendations have to be amended. Fully continuous models allow a biostatistical evaluation of complex DNA profiles with presumed allelic drop-in and drop-out events, as well as probability-based deductions of the individual DNA profiles contributing to a mixture (deconvolution). This publication provides recommendations on the use of FCM-based software and the reporting of fully continuous likelihood ratios (LRfc). It recommends an FCM-based calculation for evaluating the role of an alleged contributor to DNA evidence that is difficult to interpret, thus replacing the previous approach of a binary calculation with the exclusion of selected loci, which was previously considered acceptable in exceptional cases. The application of FCM-based software requires comprehensive user training as well as validation and verification according to the software developer's instructions. Furthermore, considerations on LRfc thresholds are described in order to ensure compatibility among probabilistic genotyping results of different origins.
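
    For orientation, the quantity being reported is a likelihood ratio of the standard textbook form (notation mine, not quoted from the recommendations), where fully continuous models evaluate both probabilities from the full electropherogram signal, including peak heights:

        LR_{fc} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}
        % E: the observed DNA evidence (including signal intensities under
        % an FCM); H_p, H_d: the prosecution and defense hypotheses on the
        % set of contributors.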